Learning rotations with little regret

Authors

Abstract


Similar Articles

Learning rotations with little regret

We describe online algorithms for learning a rotation from pairs of unit vectors in R^n. We show that the expected regret of our online algorithm, compared to the best fixed rotation chosen offline over T iterations, is O(√(nT)). We also give a lower bound that proves that this expected regret bound is optimal within a constant factor. This resolves an open problem posed in COLT 2008. Our online a...
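The abstract above does not spell out the algorithm, but a standard primitive when working with rotations online is projecting an arbitrary square matrix onto the rotation group SO(n). The sketch below (an illustration of that primitive, not the paper's method) computes the Frobenius-nearest rotation via the SVD:

```python
import numpy as np

def project_to_rotation(M):
    """Project a square matrix onto SO(n), i.e. the nearest rotation
    in Frobenius norm.

    Uses the SVD M = U S V^T: the nearest orthogonal matrix is U V^T,
    and flipping the sign of U's last column (the one paired with the
    smallest singular value) fixes the determinant to +1.
    """
    U, _, Vt = np.linalg.svd(M)
    R = U @ Vt
    if np.linalg.det(R) < 0:
        U[:, -1] *= -1
        R = U @ Vt
    return R
```

The determinant check matters: without it, U V^T may be a reflection (determinant −1) rather than a proper rotation.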


Corrigendum to "Learning rotations with little regret", September 7, 2010

There is an unfortunate error in our paper “Learning rotations with little regret” [HKW10] which appeared in COLT 2010. The sampling procedure for the noise matrix given in [HKW10] does not produce matrices with the right density. In this corrigendum, we describe the error, and give a correct sampling procedure. Unfortunately, even with the correct sampling procedure, the regret bound we get is...


Online Learning with Transductive Regret

We study online learning with the general notion of transductive regret, that is, regret with modification rules applying to expert sequences (as opposed to single experts) that are representable by weighted finite-state transducers. We show how transductive regret generalizes existing notions of regret, including: (1) external regret; (2) internal regret; (3) swap regret; and (4) conditional sw...


Learning Rotations

Many different matrix classes have been tackled recently using online learning techniques, but at least one major class has been left out: rotations. We pose the online learning of rotations as an open problem and discuss the importance of this problem.


Big Learning with Little RAM

In large-scale machine learning, available memory (RAM) is often a key constraint, both during model training and when making new predictions. In this paper, we reduce memory cost by projecting our weight vector β ∈ R^d onto a coarse discrete set using randomized rounding. Because the values of the discrete set can be stored more compactly than standard 32-bit float encodings, this reduces RAM us...
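As a sketch of the general randomized-rounding technique (an illustration, not the paper's exact procedure), each weight can be rounded to one of its two neighbouring values on a coarse grid, with probabilities chosen so that the rounding is unbiased:

```python
import random

def randomized_round(beta, grid):
    """Round each weight to one of the two nearest values in a sorted grid.

    Rounding up with probability proportional to the distance from the
    lower grid point keeps the rounding unbiased: E[rounded w] == w.
    """
    rounded = []
    for w in beta:
        # clamp to the grid's range
        w = max(grid[0], min(grid[-1], w))
        # find neighbouring grid points lo <= w <= hi
        hi_idx = next(i for i, g in enumerate(grid) if g >= w)
        lo, hi = grid[max(hi_idx - 1, 0)], grid[hi_idx]
        if hi == lo:
            rounded.append(lo)
        else:
            p_up = (w - lo) / (hi - lo)  # E = lo + p_up * (hi - lo) = w
            rounded.append(hi if random.random() < p_up else lo)
    return rounded
```

Since every stored value is an index into the small grid, the weights can be encoded in a few bits each instead of 32-bit floats, which is where the memory saving comes from.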



Journal

Journal title: Machine Learning

Year: 2016

ISSN: 0885-6125,1573-0565

DOI: 10.1007/s10994-016-5548-x